American Journal of Infection Control
Elsevier BV
Preprints posted in the last 30 days, ranked by how well they match American Journal of Infection Control's content profile, based on 12 papers previously published here. The average preprint has a 0.01% match score for this journal, so anything above that is already an above-average fit.
Shinto, H.; Chowell, G.; Takayama, Y.; Ohki, Y.; Saito, K.; Mizumoto, K.
Background: In long-term care facilities (LTCFs), close-contact identification often relies on staff recall and monitoring records because residents may be unable to self-report reliably. How these different record-generation processes relate to proximity-based sensor measurements in routine LTCF workflow remains unclear, and how such differences may influence contact-based decision-making in outbreak response is not well understood. Methods: We conducted a five-day observational study in a Japanese LTCF using ultra-wideband (UWB) indoor positioning. Twenty-seven participants wore UWB tags, including 16 residents and 11 staff members; 10 staff members completed questionnaires. We compared UWB-derived proximity with questionnaire-derived contacts from staff self-report and monitoring-based proxy records, and assessed directional discrepancies under multiple distance-time thresholds. Results: Questionnaire-based records and UWB-derived proximity showed different patterns of discrepancy across contact types. Within this facility, resident-related monitoring-based proxy records showed relatively small directional discrepancies, whereas staff self-reports tended to identify additional resident-staff contacts under the baseline threshold (≤1.0 m for ≥15 min). Several alternative thresholds were associated with discrepancies closer to zero than the baseline, although the apparent ranking varied by summary metric. Conclusions: In this single-facility observational study, different contact-list generation processes were associated with different patterns of discrepancy relative to a proximity-based operational measure. These findings support interpretation in terms of workflow-specific contact-list generation rather than a single universally optimal threshold and may help inform facility-level review of contact identification practices in LTCFs.
These findings support aligning contact identification strategies with facility-specific workflows to improve the feasibility and effectiveness of IPC practices in LTCFs.
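The study's baseline contact rule (within 1.0 m for at least 15 min) can be illustrated with a minimal sketch that scans a pairwise distance time series for qualifying episodes. This is a hypothetical reconstruction assuming one distance sample per second, not the authors' pipeline:

```python
def contact_episodes(distances_m, max_dist=1.0, min_secs=15 * 60):
    """Return (start, end) index pairs where two tags stay within
    max_dist metres for at least min_secs consecutive samples
    (assumes one distance sample per second)."""
    episodes, start = [], None
    for i, d in enumerate(distances_m):
        if d <= max_dist:
            if start is None:
                start = i  # episode opens
        else:
            if start is not None and i - start >= min_secs:
                episodes.append((start, i))  # long enough to count
            start = None  # episode (if any) closes
    # close a run that extends to the end of the series
    if start is not None and len(distances_m) - start >= min_secs:
        episodes.append((start, len(distances_m)))
    return episodes
```

Varying `max_dist` and `min_secs` over a grid would reproduce the kind of multi-threshold sensitivity analysis the abstract describes.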
Laskaris, Z.; Baron, S.; Markowitz, S. B.
Objectives: Rising temperatures are a major climate-related hazard for U.S. workers, increasing heat-related illness and a broad range of occupational injuries through indirect pathways often overlooked in economic evaluations. We examined the association between temperature and occupational injury and illness and quantified heat-attributable injuries (including illnesses) and costs in New York State. Methods: We conducted a time-stratified case-crossover study of 591,257 workers' compensation (WC) claims during the warm season (2016-2024). Daily maximum temperature was linked to injury date and county and modeled using natural cubic splines, with effect modification by industry and worker characteristics. Results: Injury risk increased with temperature, becoming statistically significant at approximately 78°F. Relative to 65°F, injury odds increased to 1.06 (95% CI: 1.01-1.10) at 80°F, 1.12 (1.07-1.18) at 90°F, and 1.17 (1.11-1.23) at 95°F. Overall, 5.0% of claims (2,322 annually) were attributable to heat. At temperatures ≥80°F, an estimated 1,729 excess injuries occurred annually, generating approximately $46 million in WC costs. An estimated $3.2 million to $36.1 million in medical expenditures were associated with incomplete claims, likely borne outside the WC system. Conclusions: These findings demonstrate substantial economic costs not fully captured within WC and support workplace heat protections as a cost-containment strategy that can reduce health care spending and strengthen workforce resilience.
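The attributable-count arithmetic behind estimates like the excess injuries above can be sketched under a standard assumption: for an acute outcome, the case-crossover odds ratio approximates a rate ratio, so the attributable fraction among exposed cases is (OR - 1)/OR. The counts below are hypothetical, not the study's data:

```python
def attributable_cases(n_exposed_cases, odds_ratio):
    """Attributable fraction among the exposed, AF_e = (OR - 1) / OR,
    scaled to a count of exposed (e.g. hot-day) cases. Valid only as an
    approximation when the OR stands in for a rate ratio."""
    af = (odds_ratio - 1.0) / odds_ratio
    return af * n_exposed_cases

# Hypothetical example: 20,000 claims filed on days >= 80 F at OR = 1.06
excess = attributable_cases(20000, 1.06)
```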
Chhabra, S.; Nair, S.; Bramley, A.; Chee, J. Y.; Vignesvaran, K.; See, D. R. E.; Sun, L. J.; Ching, A. H.; Li, A. Y.; Kayastha, G.; Chetchotisakd, P.; Cooper, B. S.; Charani, E.; Mo, Y.
Background: Antibiotic use is prevalent in hospitals, driving the emergence of drug-resistant pathogens. We investigated the contextual influences on antibiotic prescribing behaviour across hospitals in high-, middle-, and low-income countries in Asia, with the aim of providing actionable insights to improve prescribing behaviour. Methods: We conducted a large qualitative study across ten institutions in Singapore, Nepal, and Thailand. Semi-structured interviews and ethnographic observations involving physicians, nurses, pharmacists, and management staff were conducted. Data were analysed thematically using QSR NVivo 14. Findings: A total of 194 interviews were conducted amongst physicians (54.1%), nurses (19.6%), pharmacists (12.4%), and management staff (13.9%). Structural factors such as limited microbiology laboratory capabilities, concerns about antibiotic quality, weak infection prevention and control policies, and the lack of relevant, updated guidelines were prominent drivers of prolonged and broad-spectrum antibiotic prescriptions. Where these system supports were in place, prescribing decisions were less defensive and more targeted, although prescriber responsibility and concerns about immediate patient deterioration continued to influence practice. Across settings, clinicians tended to prioritise short-term perceived benefits of antibiotic treatment over the longer-term risks of antimicrobial resistance.
Dovlatbekyan, N. M.; Ochakovskaya, I. N.; Penjoyan, A. G.; Durleshter, V. M.; Onopriev, V. V.; Avagimov, A. D.
Objective. To evaluate the effectiveness of a bundle of interventions involving a clinical pharmacologist aimed at changing surgeons' approach to perioperative antibiotic prophylaxis (PAP) in an oncourology department. Materials and Methods. A single-center retrospective observational study was conducted. Data from 226 patients who underwent prostatectomy or nephrectomy in the oncourology department of Regional Clinical Hospital No. 2 (Krasnodar, Russia) between 2023 and 2025 were analyzed. Periods before (n=125) and after (n=101) the implementation of an Antimicrobial Stewardship (AMS) strategy bundle with active participation of a clinical pharmacologist (pre-authorization, audit with feedback, education, handshake stewardship) were compared. The primary endpoint was the proportion of surgeries performed in compliance with the PAP protocol. Secondary endpoints included the incidence of infectious complications, antibiotic consumption (DDD/100 bed-days), direct costs of antibacterial drugs, dynamics of the microbial landscape, and the Drug Resistance Index (DRI). Results. After AMS implementation, the proportion of surgeries performed in accordance with the PAP protocol increased from 0% to 47.7% for prostatectomies and to 55.6% for nephrectomies. The mean duration of antibiotic use decreased from 7 to 2 days (p<0.001). Antibiotic consumption decreased by 31.2%, and costs were reduced by a factor of 4.3. The proportion of ESKAPE organisms in the microbial profile decreased from 26.3% to 16.4%. There was no statistically significant increase in the frequency of infectious complications (2.4% vs. 3.0%; p=1.000) or mortality (0% in both groups). Conclusions. AMS implementation integrating a clinical pharmacologist into the oncourology department workflow significantly improved adherence to clinical guidelines and reduced irrational antibiotic use and financial costs without compromising patient safety.
This approach can serve as a model for optimizing PAP in other surgical departments. Keywords: antibiotic prophylaxis, antimicrobial stewardship, drug resistance, clinical pharmacologist, cost-benefit analysis, oncourology
Ochakovskaya, I. N.; Onopriev, V. V.; Dovlatbekyan, N. M.; Zhuravleva, K. S.; Zamulin, G. Y.; Durleshter, V. M.
Objective. To evaluate the diagnostic and prognostic significance of C-reactive protein (CRP) level dynamics within the first five days after surgery for the early detection of surgical site infections (SSI) and to identify independent risk factors, taking into account regional specifics of surgical management (types of surgeries, duration of procedures), as well as the local hospital microbial landscape. Materials and Methods. A single-center retrospective cohort analysis of data from 127 patients who underwent surgical procedures between 2022 and 2024 was conducted. CRP levels on postoperative days 1, 3, and 5 were assessed, and delta values were calculated. Descriptive statistics, ROC analysis, and multivariate logistic regression were used to identify predictors of SSI. Results. Patients with SSI lacked the physiological decrease in CRP levels by day 5. The most informative indicator was the CRP level on day 3: a threshold of >106 mg/L was associated with a high risk of SSI (AUC=0.76; sensitivity 85%, specificity 63%). Independent predictors of SSI included surgery duration (OR=1.015 per 1 min; p<0.001) and the increase in CRP between days 3 and 5 (delta CRP3-5: OR=1.027; p=0.023). A combined model (clinical parameters + CRP) demonstrated the highest predictive ability (AUC=0.79). Conclusion. Monitoring CRP dynamics, particularly on days 3 and 5, is a highly informative and accessible method for the early diagnosis of SSI. A CRP threshold of >100 mg/L on day 3 and its subsequent increase should serve as a trigger for in-depth diagnostic investigation and rationalization of antimicrobial therapy. Keywords: C-reactive protein, postoperative complications, surgical site infection, antibiotic therapy, predictive factors, diagnosis
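Threshold performance figures like the day-3 CRP cut-off above (sensitivity 85%, specificity 63% at >106 mg/L) come from evaluating a "positive if above threshold" rule against observed outcomes. A minimal sketch, using made-up values rather than the study's data:

```python
def sens_spec(values, labels, threshold):
    """Sensitivity and specificity of the rule 'positive if value > threshold'.
    labels: 1 = SSI occurred, 0 = no SSI."""
    tp = sum(1 for v, y in zip(values, labels) if y == 1 and v > threshold)
    fn = sum(1 for v, y in zip(values, labels) if y == 1 and v <= threshold)
    tn = sum(1 for v, y in zip(values, labels) if y == 0 and v <= threshold)
    fp = sum(1 for v, y in zip(values, labels) if y == 0 and v > threshold)
    return tp / (tp + fn), tn / (tn + fp)
```

Scanning candidate thresholds and maximizing Youden's J (sensitivity + specificity - 1) is one common way such an "optimal" cut-off is chosen; the paper does not state which criterion was used.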
Kapos, I. P.
Background: The UroLume endoprosthesis (AMS/Endo-care), commercially available 1988-2007 and FDA-approved in 1996, was positioned as a permanent minimally invasive solution for recurrent bulbar urethral stricture and benign prostatic hyperplasia (BPH). Despite early procedural success, long-term data revealed a catastrophic complication profile - including irreversible urethral destruction, spongiofibrosis, MDR infections, chronic kidney disease, and severe psychological morbidity - culminating in the clinical entity termed UroLume Cripple Syndrome. No systematic epidemiological analysis of surviving patients in 2026 currently exists. Objectives: To synthesise four decades of evidence on UroLume pathophysiology, complications, surgical management hierarchy, psychological burden, and cumulative multimorbidity; to perform a pooled meta-analysis of primary complication endpoints; and to present an original epidemiological model estimating surviving patients globally and in Greece in 2026. Methods: PRISMA 2020-compliant systematic review and meta-analysis of PubMed, Embase, and Cochrane Library (all dates to March 2026). Inclusion: peer-reviewed studies of UroLume implantation, explantation, or post-UroLume reconstruction; minimum 12-month follow-up; series n ≥ 10. Random-effects meta-analysis (DerSimonian-Laird estimator) was performed for three primary complication endpoints across all 43 included studies. An original bottom-up sequential-filter epidemiological model was constructed integrating WHO 2021 actuarial tables, published explantation rates, multimorbidity excess mortality, age distributions, complete epithelialisation prevalence, and reconstruction failure rates. Results: Forty-three studies met inclusion criteria (n=3,847 patients). Pooled meta-analysis yielded: restenosis/tissue ingrowth 37.9% (95% CI 36.1%-39.8%, I²=0%); stent explantation 8.7% (95% CI 7.7%-9.8%, I²=0%); urinary incontinence 9.7% (95% CI 8.7%-10.9%, I²=0%).
Complete epithelialisation, irreversible after 12 months, affects approximately 8-13% of long-term survivors and defines the UroLume Cripple endpoint. Post-UroLume buccal mucosa graft urethroplasty achieves 76.7% success at 5 years when explantation is feasible. Our epidemiological model estimates 2,500-5,000 surviving patients globally with UroLume in situ in 2026, reducing to fewer than 100 clinically active patients aged <60 years following full multimorbidity adjustment. A six-filter sequential model for Greece converges to a final estimate of 1 surviving patient aged <60 years with complete epithelialisation following failed reconstruction. Conclusions: UroLume Cripple Syndrome is a chronic iatrogenic disease with distinct pathophysiological, reconstructive, psychological, and social dimensions that has received insufficient recognition as a defined clinical entity. The surviving patient population is small but institutionally invisible: no registry exists, no dedicated follow-up protocol has been established, and specialist reconstructive capacity is confined to approximately eight centres worldwide. Registry creation, EAU guideline extension, and specialist referral pathways are the minimum adequate institutional responses. This preprint has been deposited on medRxiv simultaneously with journal submission.
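The pooling step named above uses the DerSimonian-Laird moment estimator of the between-study variance tau². A minimal sketch operating on study-level effect estimates and their variances (the choice of effect scale, e.g. logit-transformed proportions, is left to the analyst and is not taken from the paper):

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird
    moment estimator of between-study variance tau^2.
    Returns (pooled effect, tau^2, standard error)."""
    w = [1.0 / v for v in variances]           # fixed-effect weights
    sw = sum(w)
    y_fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sw
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)              # truncated at zero
    w_re = [1.0 / (v + tau2) for v in variances]
    y_re = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = (1.0 / sum(w_re)) ** 0.5
    return y_re, tau2, se
```

When Q is no larger than its degrees of freedom (as the I²=0% results above imply), tau² truncates to zero and the random-effects estimate coincides with the fixed-effect one.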
Mills, E. A.; Bingham, R.; Nijman, R. G.; Sriskandan, S.
Background: An upsurge in Streptococcus pyogenes infections in 2022-2023 highlighted potential benefits of point-of-care tests (POCT) to support clinical pathways, prevent outbreaks, and optimise antibiotic use. Objectives: We conducted a pilot research study in a west London paediatric emergency department (ED) to determine whether a molecular POCT had potential to alter management in children who were also having a conventional throat swab taken for culture. Methods: Children <16 years presenting to ED who had a throat swab requested by a clinician were invited to have a second swab taken for research purposes only. Clinical management was unaffected by the research swab result, which was processed using a molecular POCT that was not approved for use in the host NHS Trust. Results: Prevalence of streptococcal infection was low during the study (May 2023-June 2025); swab positivity in symptomatic children was 12.8% (6/47). Overall, 38/49 (77.6%) participants who had throat swabs received antibiotics. Of those children recommended to receive antibiotics, 29/38 (76.3%) had a negative POCT. Mean time to reporting of positive throat swab culture results was 3.67 days (range 3-5 days), leading to occasional delay in treatment, although POCT identified positive results within minutes. Conclusion: Antibiotic use was frequent and could be avoided or stopped by use of a rule-out POCT in over three-quarters of children in the ED, if suspicion of S. pyogenes is the main driver for prescribing. POCT were easy to process and produced immediate results compared with culture, in theory enabling timely decision-making and avoiding treatment delay.
Mwabu, A. K.; Mutai, W. C.; Jaoko, W.; Mwaniki, J. N.; Kiiru, J. N.
Introduction: Antibiotic misuse is a major driver of antimicrobial resistance (AMR), contributing to an estimated 1.27 million deaths globally. In Kenya, inappropriate antibiotic use is shaped by health-seeking behaviors and sociodemographic factors. However, little is known about how adults with productive coughs seek and use antibiotics, or how sociodemographic factors underpin these practices. This study explored antibiotic-seeking pathways, usage patterns, and the sociodemographic factors influencing these practices among adults with productive coughs attending selected chest and tuberculosis clinics in Nairobi County, Kenya. Methodology: A facility-based cross-sectional study was conducted among 400 adults (≥18 years) with productive coughs. Data were collected using a structured questionnaire on sociodemographic characteristics, antibiotic-seeking pathways, and use patterns. Results: Most participants were male (65.0%) and employed (67.0%), with 68.3% earning below Ksh 10,000 (approximately USD 80) monthly and 35.8% having basic education. A history of smoking (37.3%), tuberculosis (32.0%), or other comorbidities (29.8%) was common. Among 347 (86.7%) antibiotic users, 46.4% obtained antibiotics through general practitioners (GP) only, 31.4% via both GP and over-the-counter (OTC) sources, 15.3% from OTC only, and 6.9% through self-medication. Females were more likely to self-medicate (13.3% vs. 3.2%) and had higher odds of antibiotic use (cOR: 2.00; 95% CI: 1.04-4.10). Tuberculosis history was linked to greater GP reliance (61.7% vs. 37.4%). Low-income participants mainly used GP-only sources, while higher-income earners favored GP plus OTC routes (RRR: 2.67; 95% CI: 1.41-5.05). Empirical use was common (71.1%), dominated by Amoxicillin (90.8%), with multiple antibiotic use reported by 67.2% of the participants.
Conclusion: Antibiotic use among adults with productive coughs in Nairobi was widespread and largely empirical, dominated by Amoxicillin and Amoxicillin/Clavulanic acid. Self-medication, unregulated antibiotic access, and inappropriate use highlight the urgent need for stricter prescription enforcement and strengthened stewardship programs to promote rational antibiotic use and curb AMR.
Tankpinou Zoumenou, H.; Faucher, J.-F.
Background: Metronidazole (MTZ) is a first-line antibiotic for several enteric infections. Its use is common in low-income countries, where most primary-care consultations are conducted by nurses. However, increasing resistance among some enteric pathogens is a growing concern. Using WHO guidelines, we conducted a register-based cross-sectional study to assess MTZ prescribing practices and their determinants in public and private primary healthcare facilities in South Benin. Methods: We performed a register-based cross-sectional study covering the year 2020 in 11 primary healthcare facilities (5 public and 6 private) in Abomey-Calavi, South Benin, following WHO recommendations. In total, 200 visits per facility were selected using systematic random sampling. The primary outcome was the prevalence of MTZ prescription. Determinants of MTZ prescription were identified using multivariable logistic regression analysis. Results: In total, 2,200 medical visits were analyzed. The median age of patients was 19 years, and 57% were female. Antimalarials were prescribed in 52% of visits. Antibacterial agents were prescribed in the majority of visits, with MTZ being the second most frequently prescribed antibiotic (18%), after aminopenicillins (27%). In multivariable analysis, digestive symptoms (adjusted odds ratio [aOR], 8.65; 95% confidence interval [CI], 6.49-11.6), genitourinary symptoms (aOR, 6.84; 95% CI, 3.18-15.0), and skin lesions (aOR, 2.39; 95% CI, 1.58-3.60) were independently associated with increased odds of MTZ prescription. In contrast, fever (aOR, 0.66; 95% CI, 0.49-0.87), respiratory symptoms (aOR, 0.44; 95% CI, 0.26-0.71), and malaria (aOR, 0.21; 95% CI, 0.15-0.28) were associated with decreased odds. Visits in the private sector were also associated with higher odds of MTZ prescription compared with the public sector (aOR, 2.31; 95% CI, 1.78-3.02). 
Conclusion: MTZ is the second most commonly prescribed antibiotic in primary care in the study area, with its use largely driven by digestive symptoms. Further studies are needed to assess the appropriateness of this prescribing. Additionally, research is warranted to better understand the determinants of higher antimicrobial prescribing in the private healthcare sector.
Welch, A. M.; Beseler, C. L.; Cross, S. T.
Purpose: Alpha-gal syndrome (AGS) is an emerging health issue. This syndrome, caused by the bites of ticks, induces allergic reactions to the sugar molecule galactose-alpha-1,3-galactose after exposure to non-primate mammalian meat and other byproducts. Agricultural workers spend significant time outdoors, placing them at increased risk for tick bites and tick-borne diseases such as AGS. This study aimed to characterize farmers' and ranchers' prior knowledge, symptomology, and diagnostic experiences with AGS. Methods: We conducted a cross-sectional survey of more than 200 farmers and ranchers with a self-reported AGS diagnosis. The survey captured farmers' and ranchers' experiences related to prior knowledge and experience with tick bites and AGS, reported symptoms, and obtaining a diagnosis. Findings: A total of 201 respondents across 26 states participated in the survey, with the majority from Missouri and Oklahoma. We identified four distinct symptom clusters, with the most reported symptoms being abdominal cramping, diarrhea, itchy skin, and nausea. Women more often reported gastrointestinal discomfort, and men were more likely to be in the mild symptom category. On average, participants reported 2.98 medical provider visits before receiving a diagnosis, most being diagnosed by general practitioners and allergists. Conclusions: No previous studies have focused on the symptom and diagnostic experiences of farmers and ranchers with AGS. Capturing such data is essential as these workers may experience unique occupational challenges following AGS diagnosis. The diagnostic experience data support a continuing need to educate and empower AGS patients and providers, especially agricultural workers and providers serving rural communities.
Kjaergaard, C.; Madeleine, P.; Dalboege, A.; Steinhilber, B.; Olesen, A. V.; Nielsen, T. K.
Background Trials in occupational populations, such as surgeons, face feasibility challenges due to high workload, restricted availability, and clinical heterogeneity, which may compromise recruitment, adherence, and retention. Objective To prespecify the feasibility framework and progression criteria for an internal pilot phase embedded within a pragmatic randomized controlled trial (RCT) comparing Mechanical Diagnosis and Therapy with generalized exercise in surgeons with chronic spinal pain. Design Protocol for a prespecified internal pilot phase embedded within a pragmatic, two-arm, parallel-group RCT. Methods The internal pilot will include the first four months of recruitment and aims to randomize at least 12 participants. Feasibility will be assessed across predefined domains, i.e., recruitment, eligibility, consent, intervention uptake, adherence, retention, data completeness, and treatment fidelity. Each domain is operationally defined and linked to prespecified progression criteria to ensure interpretability and decision-making utility. Criteria will be interpreted collectively to guide trial continuation. A minimal qualitative process evaluation will be embedded. Ethics and dissemination The host trial has received ethical approval (N-20240046) and is registered at ClinicalTrials.gov (NCT07293130). The findings from the internal pilot will be reported in a separate feasibility manuscript.
Sheth, E.; Case, L.; Shaw, F.; Dwyer, N.; Poland, J.; Wan, Y.; Larru, B.
Background Pseudomonas aeruginosa is a major cause of healthcare-associated infections in paediatric settings, where its persistence in moist environments such as hospital water and wastewater systems poses a particular risk to neonates and immunocompromised children. Aim The aim of this study was to characterise the long-term survival and transmission of P. aeruginosa in a large tertiary children's hospital in England, knowledge that is crucial for developing water-safe care strategies. Methods Environmental P. aeruginosa isolates were collected from taps, sinks, showers, and baths in augmented care areas of a 330-bed tertiary children's hospital built to NHS water-safety standards. Clinical isolates were classified as invasive (blood, cerebrospinal fluid, and bronchoalveolar lavage) or non-invasive (respiratory, urine, ear, abdominal, and rectal surveillance). Variable number tandem repeat (VNTR) profiles and metadata were extracted from PDF reports, de-identified, deduplicated, and curated using Python and R. Findings This retrospective study analysed nine-locus VNTR profiles of 457 P. aeruginosa isolates submitted to the UK Health Security Agency from a large tertiary children's hospital, identifying 56 isolate clusters (each with ≥2 isolates), of which 19 (34%) contained at least one invasive isolate. The most persistent cluster (Cluster 1, n=20) spanned from July 2016 to September 2024, containing environmental and clinical (invasive and non-invasive) isolates. Conclusion These findings demonstrate long-term persistence of certain genotypes and temporal overlap between environmental and clinical isolates, highlighting the difficulty of detecting and eradicating P. aeruginosa in hospital water and wastewater systems and reinforcing the need for continuous, rigorous water system controls.
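Grouping isolates into the kind of clusters reported above can be sketched as grouping records by VNTR profile. This illustrative version assumes exact nine-locus matches define a cluster, which may differ from the reference laboratory's actual matching rules:

```python
from collections import defaultdict

def vntr_clusters(isolates, min_size=2):
    """Group isolates sharing an identical VNTR profile.
    isolates: iterable of (isolate_id, profile) pairs, where profile is a
    sequence of repeat counts (nine loci in this study's scheme).
    A cluster is any profile carried by >= min_size isolates."""
    groups = defaultdict(list)
    for isolate_id, profile in isolates:
        groups[tuple(profile)].append(isolate_id)
    return {p: ids for p, ids in groups.items() if len(ids) >= min_size}
```

A cluster whose id list mixes environmental and clinical isolates is exactly the environmental-clinical overlap the Findings section highlights.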
Jamard, S.; Le Moal, G.; Plouzeau-Jayle, C.; Arvieux, C.; Ressier, S.; Lecomte, R.; Corvec, S.; Ansart, S.; Lamoureux, C.; Abgueguen, P.; Chenouard, R.; Lartigue, M. F.; Lemaignen, A.
Introduction: Streptococcus is the second most common genus involved in bone and joint infections (BJIs) after Staphylococcus, and Streptococcus agalactiae is the predominant Streptococcus species implicated in BJIs. However, unlike Staphylococcus-related BJIs, data on S. agalactiae infections remain scarce. Methods: We conducted a retrospective cohort study from the West Region cohort of the CRIOAc registry among six university hospitals, including all microbiologically confirmed streptococcal BJIs in adults between 2014 and 2023. Results: 1,454 patients were included, with a median age of 67 years; 65% were male. S. agalactiae was the predominant streptococcal species involved (423/1454; 29%). The most prevalent comorbidities identified were obesity (378/1454; 26%) and diabetes mellitus (343/1454; 24%). Prosthetic joint infections (PJIs) were the most common presentation (653/1454; 45%); although diabetic foot osteitis was less prevalent overall, it was significantly more associated with S. agalactiae infections (48/423; 11% versus 70/1031; 7%, p=0.05). S. agalactiae BJIs were more frequently lower-limb and chronic infections (240/423; 57% versus 502/1031; 49%, p=0.04). Half of the cohort had a polymicrobial infection, which was slightly more frequent in S. agalactiae BJIs (235/423; 56% versus 498/1031; 48%, p=0.1). These results were consistent in a sensitivity analysis excluding diabetic foot-related osteitis. Logistic regression analysis identified arteriopathy (OR: 4.16; 95% CI: 1.64-11.24; p=0.003) and obesity (OR: 2.57; 95% CI: 1.41-4.78; p=0.002) as specific risk factors for S. agalactiae BJIs. Conclusion: S. agalactiae emerges as a prominent and distinct pathogen in complex streptococcal BJIs, with specific risk factors such as arteriopathy, obesity, and diabetes mellitus, and more chronic infections.
Matuli, C.; Waeni, J. M.; Gicheru, E. T.; Sande, C. J.; Gallagher, K.
Background: To date, accessible diagnostic tools to identify whether a patient's pneumonia is a bacterial or viral infection are not accurate or timely enough to prevent preemptive antibiotic administration. Relying on single biomarkers or clinical presentations has been insufficient. We aimed to incorporate a wide range of novel biomarkers and clinical presentations in a multivariable model and validate its capacity to differentiate cases of bacterial and viral pneumonia. Methods: Data from 457 children aged 2-59 months, admitted to Kilifi County Referral Hospital, Kenya, with bacterial (n = 229) and viral (n = 228) infections, were used to develop and validate a predictive multivariable Poisson regression model to differentiate pneumonia etiology. The Receiver Operating Characteristic curve was used to assess biomarker performance and validate the model internally. Results: Sixty-three percent (63%) of the children presented with severe pneumonia: 72% of those with viral pneumonia, compared with 54% of those with bacterial pneumonia. In crude analyses, chest-wall indrawing, cough, convulsions, crackles, angiotensinogen, and Serpin Family A Member 1 were significantly associated with pneumonia etiology, controlling for age. However, only chest-wall indrawing remained significant in multivariable analyses after controlling for age. The model demonstrated fair, but inadequate, discrimination, with an Area Under the Curve of 0.61. Conclusion: Among the children admitted to hospital with WHO-defined pneumonia, a wide range of biomarkers and clinical presentations still failed to distinguish bacterial from viral pneumonia.
Ukah, C. E.; Tendongfor, N.; Hubbard, A.; Tanue, E. A.; Oke, R.; Bassah, N.; Yunika, L. K.; Ngu, C. N.; Christie, S. A.; Nsagha, D. S.; Chichom-Mefire, A.; Juillard, C.
Background: Commercial motorcycle riders are among the most vulnerable road users in low- and middle-income countries and contribute substantially to the burden of road traffic injuries. The use of personal protective equipment (PPE), including helmets and protective clothing, reduces injury severity; however, uptake remains suboptimal. This study evaluated the effectiveness of a theory-driven health education intervention in improving knowledge, attitudes, and use of PPE among commercial motorcycle riders in Cameroon. Methods: A quasi-experimental, non-randomized controlled before-and-after study was conducted in Limbe (intervention) and Tiko (control) Health Districts between August 4, 2024, and April 6, 2025. Participants were recruited from a cohort of commercial motorcycle riders and followed over an eight-month intervention period. The intervention, guided by the Health Belief Model and developed using the Intervention Mapping framework, combined face-to-face sensitization sessions with mobile phone-based educational messaging adapted to participants' literacy levels and communication preferences. Data were collected at baseline and endline using structured questionnaires and direct observation checklists. Intervention effects were estimated using difference-in-differences analysis with generalized estimating equations, adjusting for socio-demographic factors. Results: A total of 313 riders were enrolled at baseline (183 intervention, 130 control), with 249 retained at endline (149 intervention, 100 control). The intervention was associated with significant improvements in PPE knowledge (β = 2.91; 95% CI: 2.14-3.68; p < 0.001) and attitudes (β = 5.76; 95% CI: 4.32-7.21; p < 0.001) compared with the control group. No statistically significant effect was observed for PPE practice scores (β = 0.21; 95% CI: -0.09 to 0.52; p = 0.171).
Among individual PPE items, helmet use increased significantly in the intervention group relative to the control group (AOR = 2.38; 95% CI: 1.19-9.45; p = 0.036), while no significant effects were observed for gloves, trousers, eyeglasses, or closed-toe shoes. Conclusion: The theory-driven health education intervention significantly improved knowledge of and attitudes toward PPE and increased helmet use among commercial motorcycle riders but did not lead to broader improvements in the uptake of other protective equipment. These findings highlight the need for complementary structural and policy interventions to address persistent barriers to PPE use in similar low-resource settings. Trial registration: ClinicalTrials.gov Identifier: NCT07087444 (registered July 28, 2025, retrospectively)
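The core difference-in-differences contrast underlying the design above can be sketched as the change in the treated group minus the change in the control group. This toy version works on raw group means with hypothetical scores; the study itself estimated adjusted effects with generalized estimating equations, which this sketch does not attempt:

```python
from statistics import mean

def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Two-group, two-period difference-in-differences on group means:
    (treated post - treated pre) - (control post - control pre)."""
    return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))
```

The control group's pre-to-post change stands in for what would have happened to the intervention group without the programme (the parallel-trends assumption).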
Gallardo Mejia, A.; Almeida, J.
Urinary tract infections (UTIs) are among the most common infectious diseases worldwide, with Escherichia coli being the predominant uropathogen. The increasing prevalence of extended-spectrum beta-lactamase (ESBL)-producing strains and their association with fluoroquinolone resistance pose a significant challenge to empirical therapy, particularly in community settings. The aim of this study was to determine the epidemiology and predictive factors associated with ESBL-producing E. coli and its concomitant fluoroquinolone resistance in community-acquired clinical isolates. A retrospective cross-sectional study was conducted analyzing 244 clinical E. coli isolates. Demographic and microbiological data were collected, including age, sex, sample type, and antibiotic susceptibility. Associations between variables and ESBL production were assessed using Pearson's chi-squared test, and odds ratios (ORs) with 95% confidence intervals (CIs) were calculated. Of the isolates, 165 (68%) were ESBL-producing. A significant association was observed between age group and ESBL production (p < 0.001), with the highest frequency in the 20-39 age group. Most ESBL-positive isolates were obtained from women (73%), although OR analysis suggested a non-significant trend toward a higher probability in men (OR = 1.29; 95% CI: 0.72-2.31). High rates of fluoroquinolone resistance were identified among the ESBL-producing isolates, with 30% resistance to levofloxacin and 35% to ciprofloxacin (p < 0.001). Urine samples showed the highest concentration of ESBL-positive isolates, with a significant association between sample type and resistance (p < 0.001). The high prevalence of ESBL-producing E. coli and its concomitant resistance to fluoroquinolones highlight a critical challenge for the empirical treatment of urinary tract infections in Mexico, underscoring the need to strengthen antimicrobial use management and local surveillance strategies.
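Odds ratios with 95% confidence intervals, as reported above, derive from a 2x2 exposure-by-outcome table; the Wald interval on the log scale is the usual construction. A minimal sketch with hypothetical counts, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without.
    Assumes no cell is zero (otherwise a continuity correction is needed)."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi
```

An interval that straddles 1.0, like the male-sex OR of 1.29 (0.72-2.31) above, is what marks the association as non-significant at the 5% level.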
Farre, R.; Salama, R.; Rodriguez-Lazaro, M. A.; Kiarostami, K.; Fernandez-Barat, L.; Oliveira, V. D. C.; Torres, A.; Farre, N.; Dinh-Xuan, A. T.; Gozal, D.; Otero, J.
Background: The COVID-19 pandemic exposed critical shortages of mechanical ventilators, particularly in low-resource settings. Disruptions in global supply chains and dependence on specialized components highlighted the need for scalable, locally manufactured alternatives for emergency respiratory support. Aim: To describe and evaluate a simplified, supply-chain-independent mechanical ventilator assembled from widely available automotive and simple hardware components, intended as a last-resort solution. Methods: The ventilator is based on a reciprocating air pump driven by an automotive windshield wiper motor coupled to parallel shaft bellows and readily assembled passive membrane valves, requiring only materials available from standard hardware retailers, minimal tools, and basic manual skills. Ventilator performance was assessed through bench testing using a patient model simulating severe lung disease in adult (R=20 cmH2O·s/L, C=15 mL/cmH2O) and pediatric (R=50 cmH2O·s/L, C=10 mL/cmH2O) patients. A realistic proof of concept was performed in four mechanically ventilated 50-kg pigs. Results: The device delivered tidal volumes up to 600 mL and respiratory rates up to 45 breaths/min with PEEP up to 10 cmH2O, covering pediatric and adult ventilation ranges. In vivo testing showed that the ventilator maintained arterial blood gases within the targeted range. Technical details for ventilator construction are provided in an open-source video tutorial. Discussion: This low-cost ventilator demonstrated adequate performance under demanding conditions. Although not a substitute for commercial intensive care ventilators, its simplicity, autonomy, and independence from fragile supply chains provide a potentially life-saving option in resource-constrained emergency scenarios.
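The bench parameters above imply predictable airway pressures under the standard single-compartment equation of motion, Paw = R·Q + V/C + PEEP. A minimal sketch follows, assuming constant inspiratory flow and a hypothetical 1-second inspiratory time (not stated in the abstract):

```python
def constant_flow_pressures(R, C_ml, vt_ml, peep, ti_s):
    """Peak and plateau airway pressure (cmH2O) for a single-compartment
    lung model under constant inspiratory flow: Paw = R*Q + V/C + PEEP."""
    q = (vt_ml / 1000.0) / ti_s        # inspiratory flow in L/s
    plateau = vt_ml / C_ml + peep      # elastic recoil + PEEP, no flow
    peak = R * q + plateau             # add the resistive component
    return peak, plateau

# Adult test-lung settings from the abstract; ti_s=1.0 is an assumption
peak, plateau = constant_flow_pressures(R=20, C_ml=15, vt_ml=600,
                                        peep=10, ti_s=1.0)
```

With these stiff-lung parameters the model predicts a plateau of 50 cmH2O and a peak of 62 cmH2O, consistent with the "severe lung disease" stress conditions described.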
Huang, C.-H. S.; Kuehne, L. M.; Jacuzzi, G.; Olden, J. D.; Seto, E.
Military aviation training noise remains understudied despite its widespread impacts across urban, rural, and wilderness areas. The predominance of low-frequency noise and repetitive training can create pervasive noise pollution, yet past research often fails to capture the full range of health and quality-of-life effects. This study analyzed two complaint datasets related to Whidbey Island Naval Air Station noise: U.S. Navy records (2017-2020) and Quiet Skies Over San Juan County data (2021-2023). We analyzed and mapped sentiment intensity from noise complaints relative to modeled annual noise exposure, developed a typology to classify impacts, and modeled the environmental and operational factors influencing complaints. Findings revealed widespread negative sentiment and anger, often beyond the bounds of estimated noise contours, suggesting that annual cumulative noise models inadequately estimate community impacts. Complaints consistently highlighted sleep disturbance, hearing and health concerns, and compromised home environments due to shaking, vibration, and disruption of daily life. Residents also reported significant social, recreational, and work disruptions, along with feelings of fear, helplessness, and concern for children's well-being. The number of complaints was strongly associated with training schedules, with late-night sessions being the strongest predictor. A delayed response pattern suggests residents reach a frustration threshold before filing complaints. Overall, our findings demonstrate persistent negative sentiment and diverse impacts from military aviation noise. Results highlight the need for improved noise metrics, modeling, and operational adjustments to mitigate the most disruptive effects.
Kamulegeya, R.; Nabatanzi, R.; Semugenze, D.; Mugala, F.; Takuwa, M.; Nasinghe, E.; Musinguzi, D.; Namiiro, S.; Katumba, A.; Ssengooba, W.; Nakatumba-Nabende, J.; Kivunike, F. N.; Kateete, D. P.
Background: Tuberculosis (TB) remains a leading cause of infectious disease mortality worldwide, and treatment failure contributes to ongoing transmission, drug resistance, and poor clinical outcomes. Artificial intelligence and machine learning approaches have attracted growing interest for predicting tuberculosis treatment outcomes, but the literature is heterogeneous and lacks a comprehensive synthesis. Methods: We conducted a systematic review and meta-analysis of studies that developed or validated machine learning models to predict TB treatment failure. We searched PubMed/MEDLINE and Embase from January 2000 to October 2025. Studies were eligible if they developed, validated, or implemented an artificial intelligence or machine learning model for the prediction of TB treatment failure or a closely related poor outcome in patients receiving anti-TB treatment. Risk of bias was assessed using the Prediction model Risk Of Bias Assessment Tool. Random-effects meta-analysis was performed to pool area under the curve (AUC) values, with subgroup analyses and meta-regression to explore heterogeneity. Results: Thirty-four studies were included in the systematic review, of which 19 reported AUC values suitable for meta-analysis (total participants, 100,790). Studies were published between 2014 and 2025, with 91% published from 2019 onward. Tree-based methods were the most common algorithm family (52.9%), and multimodal models integrating three or more data types were used in 41.2% of studies. The pooled AUC was 0.836 (95% confidence interval 0.799-0.868), with substantial heterogeneity (I² = 97.9%). In subgroup analyses, studies including HIV-positive participants showed lower discrimination (pooled AUC 0.748) compared to those excluding them (0.924).
Only eight studies (23.5%) performed external validation, and only one study (2.9%) was rated as low risk of bias overall; ratings were downgraded primarily because of methodological concerns in the analysis domain. Egger's test suggested publication bias (p = 0.024). Major evidence gaps included underrepresentation of high-burden countries, HIV-affected populations, social determinants, pediatric TB, and extrapulmonary disease. Conclusions: Machine learning models for predicting TB treatment failure show promising discrimination but are not yet ready for routine clinical implementation. Performance varies substantially across populations and settings, and methodological limitations, including inadequate validation, poor calibration assessment, and high risk of bias, limit confidence in current estimates. Future research should prioritize rigorous external validation, calibration assessment, and development in underrepresented populations, particularly HIV-affected and high-burden settings. Author Summary: TB kills over a million people annually. While curable, treatment failure remains common and drives ongoing transmission and drug resistance. Researchers increasingly use artificial intelligence and machine learning to predict which patients will fail treatment, but it is unclear whether these models are ready for clinical use. We reviewed 34 studies including nearly 1.1 million participants from 22 countries. On average, models correctly distinguished patients who would fail treatment from those who would not 84% of the time, a performance generally considered good. However, this average hid enormous variation. Models developed in populations including HIV-positive people performed substantially worse, suggesting prediction is harder with HIV co-infection. Worryingly, only one study used high-quality methods; 97% had serious flaws in handling missing data, checking calibration, or testing in new populations. Only eight studies validated their models in different settings.
To conclude, machine learning shows promise for predicting TB treatment failure, but it is not yet ready for clinical use. Researchers should prioritize validation in high-burden settings, include social determinants, and improve methodological rigor before these tools can help patients.
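Random-effects pooling of per-study estimates, as used for the AUC values above, is commonly done with the DerSimonian-Laird method of moments. A minimal sketch with hypothetical study inputs follows (a real analysis may pool on a transformed, e.g. logit, scale):

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate via the DerSimonian-Laird
    method of moments; effects/variances are per-study values."""
    k = len(effects)
    w = [1 / v for v in variances]                  # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)              # between-study variance
    w_re = [1 / (v + tau2) for v in variances]      # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = (1 / sum(w_re)) ** 0.5
    return pooled, tau2, se

# Hypothetical per-study AUCs and variances, illustration only:
pooled, tau2, se = dersimonian_laird([0.80, 0.85, 0.90, 0.75],
                                     [0.002, 0.001, 0.003, 0.002])
```

The estimated tau² quantifies between-study heterogeneity, which the I² = 97.9% above indicates is substantial in this literature.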
Gohari, M. R.; Zhang, P.; Villegas, A.; Rosella, L. C.; Patel, S. N.; Hopkins, J. P.; Duvvuri, V. R.
Antimicrobial resistance (AMR) is a growing global public health threat that complicates the treatment and control of bacterial infections. Shigella spp., a leading cause of bacterial diarrhea worldwide, have increasingly exhibited resistance to multiple antimicrobial agents commonly recommended for treating severe shigellosis. Although conventional antimicrobial susceptibility testing (AST) remains the reference standard, it is time-consuming and provides limited insight into the genetic mechanisms underlying resistance. Whole-genome sequencing (WGS) has emerged as a complementary approach for AMR detection by enabling direct identification of genetic resistance determinants encoded in bacterial genomes. Machine learning (ML) methods applied to genomic features such as k-mers have shown promise for predicting resistance phenotypes from WGS data; however, applications to Shigella remain limited. In this study, we developed and evaluated an interpretable ML framework for predicting ciprofloxacin resistance using k-mer features derived from WGS data of 1,424 Shigella isolates collected in Ontario, Canada, between 2018 and 2025. K-mers were extracted from known gene targets associated with ciprofloxacin resistance, including chromosomal quinolone resistance-determining regions (QRDRs: gyrA and parC) and plasmid-mediated determinants (qnr). Supervised ML approaches were trained and compared. We evaluated the influence of k-mer length (k = 11, 15, 21, and 31) on predictive performance and model interpretability, and compared models based on chromosomal determinants alone with models incorporating both chromosomal and plasmid-mediated determinants. The Random Forest classifier achieved the most consistent performance across models. Inclusion of plasmid-mediated determinants improved predictive accuracy relative to chromosomal-only models.
Although differences across k-mer lengths were modest, k = 11 produced the highest area under the receiver operating characteristic curve (AUC) and the lowest Brier score. SHAP analyses localized high-impact features within the QRDRs of gyrA and parC, supporting biological interpretability. These findings demonstrate that biologically informed, k-mer-based ML models can accurately and transparently predict ciprofloxacin resistance in Shigella, supporting their potential integration into genomic AMR surveillance and digital public health frameworks. Author summary: In this study, we used genome sequencing data to develop machine learning models that predict ciprofloxacin resistance in Shigella directly from bacterial DNA. We focused on small DNA fragments (k-mers) derived from known resistance genes and mutations. Among the approaches tested, a Random Forest model showed the most consistent performance. Combining chromosomal mutations with plasmid-mediated resistance genes improved prediction accuracy and helped identify key genetic regions associated with resistance. These findings demonstrate that machine learning applied to genomic data can accurately and interpretably predict antibiotic resistance, supporting its potential use in genomic surveillance and public health monitoring.
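The k-mer featurization step described above can be sketched in a few lines. The toy sequences and k = 4 below are illustrative stand-ins, not the study's QRDR targets or k-mer lengths; the downstream Random Forest would then be trained on the resulting presence/absence matrix:

```python
from collections import Counter

def kmer_counts(seq, k):
    """Count overlapping k-mers in a DNA sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def kmer_matrix(seqs, k):
    """Presence/absence feature vectors over the union of observed k-mers."""
    counts = [kmer_counts(s, k) for s in seqs]
    vocab = sorted(set().union(*counts))
    return vocab, [[int(km in c) for km in vocab] for c in counts]

# Hypothetical toy fragments standing in for resistance-gene regions:
vocab, X = kmer_matrix(["ACGTACGGT", "ACGTTCGGT"], k=4)
```

Each isolate becomes a binary row over the shared k-mer vocabulary, so a tree-based classifier can attribute predictions back to specific k-mers, which is what enables the SHAP-based localization described above.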